An Efficient Descent Method for Locally Lipschitz Multiobjective Optimization Problems

Authors

Bennet Gebken, Sebastian Peitz

Abstract

In this article, we present an efficient descent method for locally Lipschitz continuous multiobjective optimization problems (MOPs). The method is realized by combining a theoretical result regarding the computation of descent directions for nonsmooth MOPs with a practical method to approximate the subdifferentials of the objective functions. We show convergence to points which satisfy a necessary condition for Pareto optimality. Using a set of test problems, we compare our method with the multiobjective proximal bundle method by Mäkelä. The results indicate that our method is competitive while being easier to implement. While the number of objective function evaluations is larger, the overall number of subgradient evaluations is lower. Finally, we show that our method can be combined with a subdivision algorithm to compute entire Pareto sets of nonsmooth MOPs.
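
The theoretical backbone of such descent methods is the classical observation that the minimum-norm element of the convex hull of the (approximate) subgradients of all objectives yields a common descent direction. The following minimal Python sketch (not the authors' implementation; the subgradients are assumed to be given) illustrates this computation via the dual quadratic program:

    import numpy as np
    from scipy.optimize import minimize

    def common_descent_direction(subgradients):
        """Minimum-norm element v of conv{subgradients}; -v is a descent
        direction for every objective, and v ~ 0 signals (approximate)
        Pareto criticality."""
        G = np.asarray(subgradients, dtype=float)   # shape (m, n), one subgradient per row
        m = G.shape[0]

        # Dual quadratic program:
        #   min_lambda ||G^T lambda||^2   s.t.  lambda >= 0, sum(lambda) = 1
        def sq_norm(lam):
            v = G.T @ lam
            return v @ v

        res = minimize(
            sq_norm,
            np.full(m, 1.0 / m),                    # start at the barycenter
            method="SLSQP",
            bounds=[(0.0, 1.0)] * m,
            constraints=({"type": "eq", "fun": lambda lam: lam.sum() - 1.0},),
        )
        return -(G.T @ res.x)

    # Example: f1(x) = |x1| + |x2| and f2(x) = (x1 - 1)^2 + x2^2 at x = (0.5, 0.5).
    xi1 = np.array([1.0, 1.0])    # (sub)gradient of f1 at x (f1 is smooth there)
    xi2 = np.array([-1.0, 1.0])   # gradient of f2 at x
    d = common_descent_direction([xi1, xi2])   # approx. (0, -1): decreases both objectives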


Similar Articles

The proximal point method for locally Lipschitz functions in multiobjective optimization

This paper studies the constrained multiobjective optimization problem of finding Pareto critical points of vector-valued functions. The proximal point method considered by Bonnel et al. (SIAM J. Optim., 15 (2005), pp. 953-970) is extended to locally Lipschitz functions in the finite-dimensional multiobjective setting. To this end, a new approach for the convergence analysis of the method is propose...
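
For orientation, the iteration of Bonnel et al. that is extended here can be written roughly as follows (a sketch in the notation of the cited reference; \preceq is the componentwise partial order, e a fixed vector with positive entries, and \operatorname{argmin}^w the set of weak Pareto solutions of the subproblem):

    x^{k+1} \in \operatorname{argmin}^{w}_{x \in \Omega_k} \left\{ F(x) + \frac{\alpha_k}{2} \, \lVert x - x^k \rVert^2 \, e \right\}, \qquad \Omega_k = \{\, x : F(x) \preceq F(x^k) \,\},

with proximal parameters \alpha_k > 0; in the locally Lipschitz setting, criticality of the limit points is naturally stated in terms of Clarke subdifferentials.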


An Efficient Conjugate Gradient Algorithm for Unconstrained Optimization Problems

In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. Parameters of the method are obtained by solving an optimization problem and using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information in each iteration. The proposed method has global convergence und...
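
As a rough illustration of the surrounding loop, here is a generic nonlinear conjugate gradient iteration in Python. The Polak-Ribière+ parameter and the Armijo backtracking line search are standard stand-ins; the paper's own parameter (derived from the modified secant condition) is not reproduced here:

    import numpy as np

    def armijo(f, x, d, g, t=1.0, c=1e-4, rho=0.5):
        # Simple backtracking line search (Armijo condition).
        while f(x + t * d) > f(x) + c * t * (g @ d):
            t *= rho
        return t

    def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=1000):
        # Generic nonlinear conjugate gradient loop. The Polak-Ribiere+
        # parameter below is a standard stand-in, NOT the paper's parameter.
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            if g @ d >= 0:                # safeguard: restart with steepest descent
                d = -g
            t = armijo(f, x, d, g)
            x_new = x + t * d
            g_new = grad(x_new)
            beta = max(0.0, g_new @ (g_new - g) / (g @ g))
            d = -g_new + beta * d
            x, g = x_new, g_new
        return x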


An efficient improvement of the Newton method for solving nonconvex optimization problems

Newton's method is one of the most famous numerical methods among the line search methods for minimizing functions. It is well known that the search direction and step length play important roles in this class of methods for solving optimization problems. In this investigation, a new modification of the Newton method for solving unconstrained optimization problems is presented. The significant ...
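
For context, the textbook damped Newton iteration that such modifications start from looks as follows; the steepest-descent fallback and the backtracking step length below are generic textbook choices, not the paper's modification:

    import numpy as np

    def damped_newton(f, grad, hess, x0, tol=1e-8, max_iter=100):
        # Textbook damped Newton iteration, NOT the paper's modification:
        # Newton direction with a steepest-descent fallback for the
        # nonconvex case, plus Armijo backtracking for the step length.
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:
                break
            try:
                d = np.linalg.solve(hess(x), -g)
                if g @ d >= 0:             # not a descent direction (nonconvex case)
                    d = -g
            except np.linalg.LinAlgError:  # singular Hessian
                d = -g
            t = 1.0
            while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
                t *= 0.5
            x = x + t * d
        return x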


A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems

In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
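
The two properties named here (positive definiteness and the secant relation) are shared by the standard BFGS update, shown below as a stand-in; the paper's double-parameter scaled formula itself is not reproduced:

    import numpy as np

    def bfgs_update(B, s, y):
        # Standard BFGS update, a stand-in for the paper's double-parameter
        # scaled formula (not reproduced here). If B is positive definite
        # and s @ y > 0, the result is positive definite and satisfies the
        # secant relation B_new @ s = y.
        Bs = B @ s
        return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

    # Sanity check of the secant relation:
    B = np.eye(2)
    s = np.array([1.0, 0.5])
    y = np.array([0.8, 0.9])
    assert np.allclose(bfgs_update(B, s, y) @ s, y)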



Journal

Journal title: Journal of Optimization Theory and Applications

Year: 2021

ISSN: 0022-3239, 1573-2878

DOI: https://doi.org/10.1007/s10957-020-01803-w